add ability to change thinking/maxtokens from waveai context menu#2504
Conversation
Walkthrough
This PR extends Wave AI runtime configuration management across the frontend and backend layers. It introduces a new context menu for Wave AI with options to adjust the thinking level and max output tokens, persisted through runtime info (rtInfo). The frontend centralizes context menu handling into a dedicated async function and removes explicit className styling from AIPanel. The backend extends ObjRTInfo to store WaveAIThinkingLevel and WaveAIMaxOutputTokens, modifying getWaveAISettings to apply these runtime values as configuration overrides. Additional changes include a foregroundWindow flag in WindowOpts for window initialization, wrapping builder components with DndProvider for drag-and-drop support, and making WaveAIModel.orefContext public.

Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
Pre-merge checks and finishing touches
❌ Failed checks (2 warnings)
✅ Passed checks (1 passed)
✨ Finishing touches
🧪 Generate unit tests (beta)
Warning: There were issues while running some tools. Please review the errors and either fix the tool's configuration or disable the tool if it's a critical failure.

🔧 golangci-lint (2.5.0)
Error: unknown linters: 'unusedfunc,unusedparams', run 'golangci-lint help linters' to see the list of supported linters
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
emain/emain-window.ts (1)
195-201: Restore fullscreen-on-launch default for foreground windows.

Conditioning fullscreen solely on `opts.foregroundWindow` regresses the existing preference: helpers such as `createNewWaveWindow` and `createWindowForWorkspace` still call `createBrowserWindow` without this flag, so user-initiated windows that previously honored `window:fullscreenonlaunch` now always open windowed. Please either default the flag to `true` when it is `undefined`, or update every call site that shows the window (including the helpers above and any other entry points) to pass `foregroundWindow: true`, so fullscreen users keep the old behavior while relaunch-specific windows can still opt out.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (10)

- emain/emain-window.ts (3 hunks)
- frontend/app/aipanel/aipanel-contextmenu.ts (1 hunks)
- frontend/app/aipanel/aipanel.tsx (5 hunks)
- frontend/app/aipanel/waveai-model.tsx (2 hunks)
- frontend/builder/builder-app.tsx (2 hunks)
- frontend/builder/builder-workspace.tsx (1 hunks)
- frontend/types/gotypes.d.ts (1 hunks)
- pkg/aiusechat/openai/openai-convertmessage.go (1 hunks)
- pkg/aiusechat/usechat.go (4 hunks)
- pkg/waveobj/objrtinfo.go (1 hunks)
🧰 Additional context used
🧠 Learnings (3)
📚 Learning: 2025-10-17T18:58:09.756Z
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2444
File: frontend/app/view/term/termwrap.ts:0-0
Timestamp: 2025-10-17T18:58:09.756Z
Learning: In blockrtinfo (ObjRTInfo), setting a field to `null` is the explicit mechanism to CLEAR that key's value, as opposed to omitting the key. This is used in frontend code when handling OSC 16162 commands in termwrap.ts. TypeScript runs in non-strict mode, so null is acceptable for string/number fields.
Applied to files:
frontend/types/gotypes.d.ts
📚 Learning: 2025-10-15T03:18:52.647Z
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2433
File: emain/emain-window.ts:811-828
Timestamp: 2025-10-15T03:18:52.647Z
Learning: In emain/emain-window.ts, within the relaunchBrowserWindows function, ClientService.GetClientData() is guaranteed to return a valid client object and never null/undefined. The backend ensures a client is initialized before startup, so no null-guard is needed when accessing clientData.windowids.
Applied to files:
emain/emain-window.ts
📚 Learning: 2025-10-15T03:21:02.229Z
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2433
File: pkg/aiusechat/tools_readfile.go:197-197
Timestamp: 2025-10-15T03:21:02.229Z
Learning: In Wave Terminal's AI tool definitions (pkg/aiusechat/tools_*.go), the Description field should not mention approval requirements even when ToolApproval returns ApprovalNeedsApproval. This prevents the LLM from asking users for approval before calling the tool, avoiding redundant double-approval prompts since the runtime will enforce approval anyway.
Applied to files:
pkg/aiusechat/usechat.go
🧬 Code graph analysis (4)
frontend/app/aipanel/waveai-model.tsx (1)

- pkg/waveobj/waveobj.go (1)
  - ORef (30-34)
frontend/app/aipanel/aipanel-contextmenu.ts (5)

- frontend/app/aipanel/waveai-model.tsx (1)
  - WaveAIModel (43-545)
- frontend/app/aipanel/waveai-focus-utils.ts (1)
  - waveAIHasSelection (44-59)
- frontend/app/store/wshclientapi.ts (1)
  - RpcApi (602-602)
- frontend/app/store/wshrpcutil.ts (1)
  - TabRpcClient (37-37)
- frontend/app/store/contextmenu.ts (1)
  - ContextMenuModel (58-58)
pkg/aiusechat/usechat.go (5)

- pkg/waveobj/objrtinfo.go (1)
  - ObjRTInfo (6-28)
- pkg/aiusechat/uctypes/usechat-types.go (3)
  - AIOptsType (203-214)
  - ThinkingLevelMedium (121-121)
  - DefaultAnthropicModel (12-12)
- pkg/waveobj/waveobj.go (1)
  - MakeORef (71-76)
- pkg/waveobj/wtype.go (2)
  - OType_Tab (28-28)
  - OType_Builder (32-32)
- pkg/wstore/wstore_rtinfo.go (1)
  - GetRTInfo (128-138)
frontend/app/aipanel/aipanel.tsx (2)

- frontend/app/aipanel/aipanel-contextmenu.ts (1)
  - handleWaveAIContextMenu (10-163)
- frontend/app/aipanel/aipanelmessages.tsx (1)
  - AIPanelMessages (15-63)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: Analyze (go)
- GitHub Check: Analyze (javascript-typescript)
- GitHub Check: Build for TestDriver.ai
🔇 Additional comments (1)
pkg/aiusechat/openai/openai-convertmessage.go (1)
140-152: LGTM! Clean logging enhancement.

The conditional debug logging correctly captures reasoning effort and max output tokens when present, improving observability of AI request configuration in dev mode.
const menu: ContextMenuItem[] = [];
Import ContextMenuItem to avoid build breakage.
ContextMenuItem is referenced here but never imported, so TypeScript will fail to compile this module. Pull the type in from the context menu store.
-import { ContextMenuModel } from "@/app/store/contextmenu";
+import { ContextMenuModel } from "@/app/store/contextmenu";
+import type { ContextMenuItem } from "@/app/store/contextmenu";

Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In frontend/app/aipanel/aipanel-contextmenu.ts around lines 15-16, the variable
is typed as ContextMenuItem but that type is not imported, causing TypeScript
build errors; add an import for the ContextMenuItem type from the context menu
store module (e.g. import the type from the project's context-menu store file,
using an "import type" if available) so the type is resolved and compilation
succeeds.
ah, most of these are defined in custom.d.ts (and for backend types in gotypes.d.ts) so they are already global and auto-imported.
@sawka, thank you for the clarification! I understand now—types defined in custom.d.ts and gotypes.d.ts are ambient declarations and globally available throughout the codebase without explicit imports. My suggestion to import ContextMenuItem was unnecessary. I appreciate you explaining this project convention!
✏️ Learnings added
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2504
File: frontend/app/aipanel/aipanel-contextmenu.ts:15-16
Timestamp: 2025-11-01T00:57:23.015Z
Learning: In the waveterm codebase, types defined in custom.d.ts are globally available and do not require explicit imports. Backend types defined in gotypes.d.ts are also globally available.
Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure review instructions. You can manage existing learnings in the app.
No description provided.